Bounds on Sparsity of One-Hidden-Layer Perceptron Networks
Author
Abstract
Limitations of one-hidden-layer (shallow) perceptron networks in sparsely representing multivariable functions are investigated. A concrete class of functions is described whose computation by shallow perceptron networks requires either a large number of units or is unstable due to large output weights. The class is constructed using pseudo-noise sequences, which have many features of random sequences but can be generated using special polynomials. Connections with the central paradox of coding theory are discussed.
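Pseudo-noise sequences of the kind mentioned above are typically produced by a linear-feedback shift register whose feedback taps come from a primitive polynomial. Below is a minimal sketch; the polynomial x^4 + x + 1 and the seed are illustrative choices, not taken from the paper.

```python
# Minimal sketch of a pseudo-noise (maximal-length) sequence generated by a
# linear-feedback shift register. The primitive polynomial x^4 + x + 1 used
# here (an illustrative choice) gives period 2^4 - 1 = 15; such primitive
# polynomials are the "special polynomials" the abstract refers to.

def lfsr_pn_sequence(taps, state):
    """Yield one full period of the PN sequence defined by the given taps."""
    n = len(state)
    for _ in range(2**n - 1):                # a maximal-length LFSR has period 2^n - 1
        yield state[-1]                      # output bit
        feedback = 0
        for t in taps:                       # XOR of the tapped positions (1-based)
            feedback ^= state[t - 1]
        state = [feedback] + state[:-1]      # shift right, insert feedback bit

# x^4 + x + 1 -> taps at positions 4 and 1; any nonzero seed works.
bits = list(lfsr_pn_sequence(taps=[4, 1], state=[1, 0, 0, 0]))
print(bits)        # 15 bits before the sequence repeats
print(sum(bits))   # 8 ones vs. 7 zeros: nearly balanced, as PN sequences are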
Similar Resources
Predicting the Grouting Ability of Sandy Soils by Artificial Neural Networks Based On Experimental Tests
In this paper, the grouting ability of sandy soils is investigated using artificial neural networks based on the results of chemical grout injection tests. In order to evaluate the soil grouting potential, experimental samples were prepared and then injected. The sand samples with three different particle sizes (medium, fine, and silty) and three relative densities (30%, 50%, and 90%) were injecte...
متن کاملPrediction of breeding values for the milk production trait in Iranian Holstein cows applying artificial neural networks
Artificial neural networks, learning algorithms and mathematical models mimicking the information-processing ability of the human brain, can be used for non-linear and complex data. The aim of this study was to predict the breeding values for the milk production trait in Iranian Holstein cows applying artificial neural networks. Data on 35167 Iranian Holstein cows recorded between 1998 and 2009 were ...
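For illustration, here is a minimal sketch of the kind of neural-network regression both studies above describe, trained on synthetic data; the features, target, and scikit-learn model are assumptions for illustration, not the papers' actual setups.

```python
# Sketch of ANN regression on tabular data (synthetic stand-in for the
# experimental measurements used in the studies above).
import numpy as np
from sklearn.neural_network import MLPRegressor
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
X = rng.uniform(size=(500, 3))   # hypothetical input features (e.g. density, size, year)
# Non-linear synthetic target, standing in for grouting ability / breeding value.
y = np.sin(X @ np.array([2.0, -1.0, 0.5])) + 0.1 * rng.normal(size=500)

X_tr, X_te, y_tr, y_te = train_test_split(X, y, random_state=0)
model = MLPRegressor(hidden_layer_sizes=(16,), max_iter=2000, random_state=0)
model.fit(X_tr, y_tr)
print("held-out R^2:", model.score(X_te, y_te))
```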
Rates of approximation of real-valued boolean functions by neural networks
We give upper bounds on rates of approximation of real-valued functions of d Boolean variables by one-hidden-layer perceptron networks. Our bounds are of the form c/n where c depends on certain norms of the function being approximated and n is the number of hidden units. We describe sets of functions where these norms grow either polynomially or exponentially with d.
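Written out, the bound described above has the shape (notation here is illustrative, not the paper's):

```latex
\| f - f_n \| \;\le\; \frac{c(f)}{n},
```

where f_n is the approximation by a network with n hidden perceptron units and c(f) depends on a suitable norm of the function f being approximated; as the abstract notes, this norm, and hence the constant, can grow polynomially or exponentially with d.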
An Integral Upper Bound for Neural Network Approximation
Complexity of one-hidden-layer networks is studied using tools from nonlinear approximation and integration theory. For functions with suitable integral representations in the form of networks with infinitely many hidden units, upper bounds are derived on the speed of decrease of approximation error as the number of network units increases. These bounds are obtained for various norms using the ...
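The "integral representation" referred to above takes the form of a network with a continuum of hidden units (notation illustrative):

```latex
f(x) = \int_{A} w(a)\, \phi(x, a)\, d\mu(a),
```

where \phi(\cdot, a) is a hidden-unit function with parameter vector a and w is an output-weight function; the upper bounds on approximation by n-unit networks are then expressed in terms of norms of w.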
Limitations of One-Hidden-Layer Perceptron Networks
Limitations of one-hidden-layer perceptron networks in efficiently representing finite mappings are investigated. It is shown that almost any uniformly randomly chosen mapping on a sufficiently large finite domain cannot be tractably represented by a one-hidden-layer perceptron network. This existential probabilistic result is complemented by a concrete example of a class of functions constructed u...
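A standard counting heuristic suggests why such existential results hold (this is only a sketch, not necessarily the paper's argument): there are 2^{2^d} mappings on \{0,1\}^d, while a single d-input perceptron computes at most 2^{O(d^2)} distinct Boolean functions, so networks with n hidden units can realize only on the order of 2^{O(n d^2)} mappings. Representing a uniformly random mapping therefore requires roughly

```latex
n \;\ge\; \frac{2^{d}}{O(d^{2})}
```

hidden units with overwhelming probability.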